Local Rademacher Complexity-based Learning Guarantees for Multi-Task Learning

Authors

  • Niloofar Yousefi
  • Yunwen Lei
  • Marius Kloft
  • Mansooreh Mollaghasemi
  • Georgios Anagnostopoulos
Affiliations

  • Niloofar Yousefi (Department of Electrical Engineering and Computer Science, University of Central Florida)
  • Yunwen Lei (Department of Mathematics, City University of Hong Kong)
  • Marius Kloft (Department of Computer Science, Humboldt University of Berlin)
  • Mansooreh Mollaghasemi (Department of Industrial Engineering & Management Systems, University of Central Florida)
  • Georgios Anagnostopoulos (Department of Electrical and Computer Engineering, Florida Institute of Technology)


Related articles

The Rademacher Complexity of Linear Transformation Classes

Bounds are given for the empirical and expected Rademacher complexity of classes of linear transformations from a Hilbert space H to a finite-dimensional space. The results imply generalization guarantees for graph regularization and multi-task subspace learning. 1 Introduction Rademacher averages have been introduced to learning theory as an efficient complexity measure for function classes, mot...
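As background for the bounds summarized in these abstracts, the empirical Rademacher complexity of a function class F on a sample S = (x_1, ..., x_n) is the standard quantity

```latex
\hat{\mathfrak{R}}_S(\mathcal{F})
  = \mathbb{E}_{\boldsymbol{\sigma}}
    \left[ \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i) \right],
\qquad \sigma_1, \dots, \sigma_n \ \text{i.i.d. uniform on } \{-1, +1\}.
```

Taking the expectation over samples gives the expected Rademacher complexity. The "local" variant used in the main paper restricts the supremum to a low-variance subset such as $\{ f \in \mathcal{F} : \mathbb{E} f^2 \le r \}$, which is what enables the faster convergence rates mentioned below.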


Rademacher Complexity Margin Bounds for Learning with a Large Number of Classes

This paper presents improved Rademacher complexity margin bounds that scale linearly with the number of classes as opposed to the quadratic dependence of existing Rademacher complexity margin-based learning guarantees. We further use this result to prove a novel generalization bound for multi-class classifier ensembles that depends only on the Rademacher complexity of the hypothesis classes to ...


Multi-Class Deep Boosting

We present new ensemble learning algorithms for multi-class classification. Our algorithms can use as a base classifier set a family of deep decision trees or other rich or complex families and yet benefit from strong generalization guarantees. We give new data-dependent learning bounds for convex ensembles in the multiclass classification setting expressed in terms of the Rademacher complexiti...


Learning Kernels Using Local Rademacher Complexity

We use the notion of local Rademacher complexity to design new algorithms for learning kernels. Our algorithms thereby benefit from the sharper learning bounds based on that notion which, under certain general conditions, guarantee a faster convergence rate. We devise two new learning kernel algorithms: one based on a convex optimization problem for which we give an efficient solution using exi...
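As an illustrative sketch only (not code from any of the papers listed here), the global empirical Rademacher complexity of a small finite function class can be estimated by Monte Carlo averaging over random sign vectors. All names and the toy data below are hypothetical:

```python
import numpy as np

def empirical_rademacher(values, n_draws=2000, seed=0):
    """Monte Carlo estimate of
    R_hat(F) = E_sigma[ sup_{f in F} (1/n) * sum_i sigma_i * f(x_i) ].

    values: (m, n) array whose row j holds (f_j(x_1), ..., f_j(x_n))
    for the j-th function in a finite class F.
    """
    rng = np.random.default_rng(seed)
    m, n = values.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.max(values @ sigma) / n       # sup over the class
    return total / n_draws

# Toy sign-symmetric class: a few linear predictors and their negations
# evaluated on random data (purely illustrative).
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))                  # 50 points in R^3
W = rng.standard_normal((4, 3))                   # 4 linear predictors
vals = np.vstack([W @ X.T, -W @ X.T])             # include negations
print(empirical_rademacher(vals))
```

Because the toy class is sign-symmetric, the supremum is nonnegative for every sign draw, so the estimate is nonnegative and shrinks as the sample size n grows, mirroring the O(1/sqrt(n)) behavior of global Rademacher bounds. Local analyses sharpen this by restricting the class before taking the supremum.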


Multi-Task Learning Using Neighborhood Kernels

This paper introduces a new and effective algorithm for learning kernels in a Multi-Task Learning (MTL) setting. Although we consider an MTL scenario here, our approach can be easily applied to standard single-task learning as well. As shown by our empirical results, our algorithm consistently outperforms traditional kernel learning algorithms such as the uniform combination solution, convex c...




Journal:
  • CoRR

Volume: abs/1602.05916

Publication date: 2016